On Improved Bounds for Probability Metrics and $f$-Divergences

Author

  • Igal Sason
Abstract

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that, in some cases, lead to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter, capacitory discrimination, chi-squared divergence, Chernoff information, Hellinger distance, relative entropy, and the total variation distance. The presentation is intended to be self-contained.

Index Terms – Bhattacharyya parameter, capacitory discrimination, Chernoff information, chi-squared divergence, f-divergence, Hellinger distance, relative entropy, total variation distance.
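For reference, the quantities named in the abstract admit the following standard definitions for probability measures P and Q with densities p and q with respect to a common dominating measure μ. These are the usual textbook conventions, not quotations from the paper, whose normalizations may differ (e.g., the factor 1/2 in the Hellinger distance); the symbols Z, C, and \bar{C} are chosen here purely for illustration.

\begin{align*}
  d_{\mathrm{TV}}(P,Q) &= \frac{1}{2}\int |p-q| \,\mathrm{d}\mu && \text{(total variation distance)} \\
  D(P\|Q) &= \int p \log\frac{p}{q} \,\mathrm{d}\mu && \text{(relative entropy)} \\
  \chi^2(P,Q) &= \int \frac{(p-q)^2}{q} \,\mathrm{d}\mu && \text{(chi-squared divergence)} \\
  H^2(P,Q) &= \frac{1}{2}\int \bigl(\sqrt{p}-\sqrt{q}\bigr)^2 \,\mathrm{d}\mu && \text{(squared Hellinger distance)} \\
  Z(P,Q) &= \int \sqrt{pq} \,\mathrm{d}\mu = 1 - H^2(P,Q) && \text{(Bhattacharyya parameter)} \\
  C(P,Q) &= -\min_{\lambda\in[0,1]} \log \int p^{\lambda} q^{1-\lambda} \,\mathrm{d}\mu && \text{(Chernoff information)} \\
  \bar{C}(P,Q) &= D\Bigl(P \,\Big\|\, \tfrac{P+Q}{2}\Bigr) + D\Bigl(Q \,\Big\|\, \tfrac{P+Q}{2}\Bigr) && \text{(capacitory discrimination)}
\end{align*}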


Similar articles

IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES: On Improved Bounds for Probability Metrics and f-Divergences

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that, in some cases, lead to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter,...


Bounds on f-Divergences and Related Distances

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker’s inequality is derived for an arbitra...
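For context (standard facts, not quoted from the listing above): Pinsker's inequality lower-bounds the relative entropy by the total variation distance, while a "reversed" Pinsker-type inequality must upper-bound it; the latter fails in general, since D(P‖Q) can be infinite while d_TV(P,Q) ≤ 1, and therefore requires extra structure such as a finite alphabet or bounded likelihood ratios. Two elementary bounds of this flavor, in nats:

\begin{align*}
  D(P\|Q) &\ge 2\, d_{\mathrm{TV}}(P,Q)^2 && \text{(Pinsker's inequality)} \\
  D(P\|Q) &\le \log\bigl(1 + \chi^2(P,Q)\bigr) && \text{(Jensen: } \mathbb{E}_P[\log\tfrac{p}{q}] \le \log \mathbb{E}_P[\tfrac{p}{q}]\text{)}
\end{align*}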


IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES: Bounds on f-Divergences and Related Distances

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker’s inequality is derived for an arbitra...


A Note on Integral Probability Metrics and φ-divergences

We study some connections between integral probability metrics [21] of the form $\gamma_{\mathcal{F}}(P,Q) := \sup_{f \in \mathcal{F}} \left| \int f \,\mathrm{d}P - \int f \,\mathrm{d}Q \right|$ ...
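Two standard instances of this form (textbook facts, independent of the note above) recover familiar metrics: the supremum over functions bounded by 1 gives the total variation distance up to normalization, and the supremum over 1-Lipschitz functions gives the Kantorovich–Wasserstein distance W_1 by Kantorovich–Rubinstein duality:

\begin{align*}
  \sup_{\|f\|_\infty \le 1} \left| \int f \,\mathrm{d}P - \int f \,\mathrm{d}Q \right| &= \int |p-q| \,\mathrm{d}\mu = 2\, d_{\mathrm{TV}}(P,Q), \\
  \sup_{\mathrm{Lip}(f) \le 1} \left| \int f \,\mathrm{d}P - \int f \,\mathrm{d}Q \right| &= W_1(P,Q).
\end{align*}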


Beyond Differential Privacy: Composition Theorems and Relational Logic for f-divergences between Probabilistic Programs

f-divergences form a class of measures of distance between probability distributions; they are widely used in areas such as information theory and signal processing. In this paper, we unveil a new connection between f-divergences and differential privacy, a confidentiality policy that provides strong privacy guarantees for private data mining; specifically, we observe that the notion of α-dis...
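For completeness, the generic definition underlying this class (standard, not specific to the paper above): given a convex function f : (0,∞) → ℝ with f(1) = 0, the f-divergence between P and Q is

\[
  D_f(P\|Q) = \int q \, f\!\left(\frac{p}{q}\right) \mathrm{d}\mu,
\]

which recovers the relative entropy for f(t) = t log t, the chi-squared divergence for f(t) = (t-1)^2, and twice the total variation distance for f(t) = |t-1|.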



Journal:
  • CoRR

Volume: abs/1403.7164

Publication date: 2014